1. The Architecture of State Transitions
Consider the logic of weather. If we assume that today's rain is the only factor influencing tomorrow's weather, we enter the realm of Markovian dynamics, as illustrated in Example 2a.
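As a minimal sketch of such a chain (the states, rain probabilities, and matrix values below are illustrative assumptions, not taken from the text), each day's weather is sampled using only the current state:

```python
import random

# Hypothetical two-state weather chain: state 0 = rain, state 1 = dry.
# P[i][j] = probability of moving from state i today to state j tomorrow.
P = [
    [0.7, 0.3],  # if it rains today: 70% rain tomorrow, 30% dry
    [0.4, 0.6],  # if it is dry today: 40% rain tomorrow, 60% dry
]

def step(state: int) -> int:
    """Sample tomorrow's weather from today's state alone --
    the Markov property: the history before today is irrelevant."""
    return 0 if random.random() < P[state][0] else 1

# Simulate one week of weather starting from a rainy day.
state = 0
week = [state]
for _ in range(6):
    state = step(state)
    week.append(state)
print(week)  # e.g. [0, 0, 1, 1, 0, 0, 0]
```

Each row of `P` sums to 1, since tomorrow must be in exactly one of the states.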
This creates a transition matrix $P$, and the two-step transition probabilities follow from the Chapman-Kolmogorov identity:
$$P_{ij}^{(2)} = \sum_{k=0}^{M} P_{ik}P_{kj}$$
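The identity just sums, over every intermediate state $k$, the probability of reaching $k$ in one step and then $j$ in one more. A short sketch, reusing a hypothetical two-state matrix:

```python
# Hypothetical 2-state transition matrix (states 0 and 1).
P = [
    [0.7, 0.3],
    [0.4, 0.6],
]
n = len(P)

# Chapman-Kolmogorov: P^(2)_{ij} = sum over intermediate states k
# of P_{ik} * P_{kj}. This is exactly matrix multiplication P @ P.
two_step = [
    [sum(P[i][k] * P[k][j] for k in range(n)) for j in range(n)]
    for i in range(n)
]
print(two_step[0][0])  # 0.7*0.7 + 0.3*0.4 ≈ 0.61
```

Note the identity is precisely the $(i,j)$ entry of the matrix product $P^2$, which is why $n$-step probabilities are powers of $P$.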
2. The Rhythm of Arrival
Randomness is not just about where we go, but when events occur. In a Poisson process, we track discrete arrivals (like earthquakes or radioactive decay) over time.
- Interarrival Times: For a Poisson process, let $T_1$ denote the time the first event occurs. For $n > 1$, let $T_n$ denote the time elapsed between the $(n-1)$st and the $n$th event.
- Stationarity: Because the increments of the process are stationary and independent, the sequence $\{T_n, n=1, 2, \ldots\}$ consists of independent, identically distributed exponential random variables with rate $\lambda$ (mean $1/\lambda$).
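These two properties give a direct way to simulate a Poisson process: draw exponential gaps and accumulate them. A sketch (the rate value and function name are illustrative assumptions):

```python
import random

random.seed(0)
lam = 2.0  # rate λ: average number of events per unit time (assumed value)

def arrival_times(n: int, lam: float) -> list[float]:
    """Times of the first n events: each interarrival gap T_k is an
    independent Exponential(λ) draw, and the n-th event occurs at
    S_n = T_1 + ... + T_n."""
    times, t = [], 0.0
    for _ in range(n):
        t += random.expovariate(lam)  # one exponential gap
        times.append(t)
    return times

events = arrival_times(10_000, lam)
mean_gap = events[-1] / len(events)
print(round(mean_gap, 2))  # average gap should be close to 1/λ = 0.5
```

With $\lambda = 2$ events per unit time, the empirical mean gap concentrates near $1/\lambda = 0.5$ as the sample grows.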
3. Information as the Reduction of Surprise
Information theory, pioneered by Claude Shannon, quantifies uncertainty. Writing $S(p)$ for the surprise of observing an event of probability $p$, it rests on a beautiful algebraic foundation, the key property being Axiom 4:
Axiom 4: $S(pq) = S(p) + S(q)$ for $0 < p \le 1, 0 < q \le 1$
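The unique continuous solution of this axiom (with $S(1) = 0$) is $S(p) = -\log_2(p)$, which the following sketch spot-checks numerically:

```python
import math
import random

def S(p: float) -> float:
    """Surprise of an event with probability p, in bits."""
    return -math.log2(p)

# Spot-check Axiom 4: S(pq) = S(p) + S(q) for random p, q in (0, 1].
random.seed(1)
for _ in range(5):
    p, q = random.random(), random.random()
    assert math.isclose(S(p * q), S(p) + S(q))

print(S(0.5), S(0.25))  # 1.0 2.0 -- halving the probability adds one bit
```

The logarithm turns the multiplication of independent probabilities into the addition of surprises, which is exactly what the axiom demands.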
This axiom implies that the surprise of two independent events is the sum of their individual surprises; its unique continuous solution is $S(p) = -\log_2(p)$, and taking the expected surprise over the outcomes of $X$ yields Shannon entropy:
$$H(X) = -\sum_{i=1}^n p_i \log_2(p_i)$$